Kernel discriminant analysis based feature selection
Abstract
For two-class problems, we propose two feature selection criteria based on kernel discriminant analysis (KDA). The first is the objective function of kernel discriminant analysis, called the KDA criterion. We show that the KDA criterion is monotonic with respect to the deletion of features, which ensures stable feature selection. The second is the recognition rate obtained by a KDA classifier, called...
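As a rough illustration of how such criteria can drive feature selection, the sketch below shows a generic greedy backward-elimination loop. It is not the authors' implementation: `backward_feature_selection` and `recognition_rate` are hypothetical names, `X` is assumed to be a NumPy array of shape (n_samples, n_features), and a cross-validated RBF-kernel SVM from scikit-learn stands in for the KDA classifier mentioned above.

```python
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC


def recognition_rate(X_sub, y):
    """Stand-in for the second criterion: cross-validated accuracy of an
    RBF-kernel SVM, used here instead of an actual KDA classifier."""
    return cross_val_score(SVC(kernel="rbf"), X_sub, y, cv=3).mean()


def backward_feature_selection(X, y, criterion, n_keep):
    """Greedily delete the feature whose removal leaves the subset
    criterion highest, until only n_keep features remain."""
    selected = list(range(X.shape[1]))  # X: (n_samples, n_features) NumPy array
    while len(selected) > n_keep:
        candidates = []
        for f in selected:
            subset = [g for g in selected if g != f]
            candidates.append((criterion(X[:, subset], y), f))
        # Drop the feature whose removal is least harmful to the criterion.
        _, feature_to_drop = max(candidates)
        selected.remove(feature_to_drop)
    return selected
```

The first criterion from the abstract, the KDA objective itself, would plug into the same loop through a different `criterion` callable; its claimed monotonicity under feature deletion is what makes this greedy scheme stable.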
Similar resources
Feature space locality constraint for kernel based nonlinear discriminant analysis
Subspace learning is an important approach in pattern recognition. Nonlinear discriminant analysis (NDA), owing to its ability to describe the nonlinear manifold structure of samples, is considered more powerful for classification tasks in image-related problems. In the kernel-based NDA representation, three spaces are involved, i.e., the original data space, the implicitly mapped high ...
Optimal feature sub-space selection based on discriminant analysis
The performance of a speech recogniser, or of any other pattern classifier, strongly depends on the input features: to obtain good performance, the feature set needs to be both highly discriminative and compact. Linear discriminant analysis (LDA) is a common data-driven method used to find linear transformations that map large feature vectors onto smaller ones while retaining most of the disc...
Discriminant Analysis for Unsupervised Feature Selection
Feature selection has proven effective in preparing high-dimensional data for data mining and machine learning. As most data is unlabeled, unsupervised feature selection has attracted increasing attention in recent years. Discriminant analysis has proven to be a powerful technique for selecting discriminative features in supervised feature selection. To apply discriminant analys...
Variable selection in kernel Fisher discriminant analysis by means of recursive feature elimination
Variable selection serves a dual purpose in statistical classification problems: it enables one to identify the input variables which separate the groups well, and a classification rule based on these variables frequently has a lower error rate than the rule based on all the input variables. Kernel Fisher discriminant analysis (KFDA) is a recently proposed powerful classification procedure, fre...
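The last entry describes variable selection via recursive feature elimination. A minimal, generic sketch of that idea (not the KFDA-specific procedure of the cited paper) can be written with scikit-learn's RFE; a linear-kernel SVM is used only because RFE needs per-feature weights (coef_) to rank variables.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

# Toy two-class data; the linear-kernel SVM is a stand-in ranking model,
# not the kernel Fisher discriminant used in the cited paper.
X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           random_state=0)
selector = RFE(SVC(kernel="linear"), n_features_to_select=5, step=1)
selector.fit(X, y)
print([i for i, keep in enumerate(selector.support_) if keep])
```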
Journal
Journal title: Neurocomputing
Year: 2008
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2008.02.018